Multi-Scale Distillation from Multiple Graph Neural Networks

Authors

Abstract

Knowledge Distillation (KD), an effective model compression and acceleration technique, has recently been applied successfully to graph neural networks (GNNs). Existing approaches utilize a single GNN as the teacher to distill knowledge. However, we notice that GNN models with different numbers of layers demonstrate different classification abilities on nodes with different degrees. On one hand, for nodes with high degrees, their local structures are dense and complex, so more message passing is needed; therefore, GNN models with more layers perform better. On the other hand, for nodes with low degrees, whose local structures are relatively sparse and simple, repeated message passing can easily lead to over-smoothing; thus, GNN models with more layers are less suitable. Consequently, existing knowledge distillation based on a single teacher model is sub-optimal. To this end, we propose a novel approach to distill multi-scale knowledge, which learns from multiple GNN teachers of different depths to capture topological and semantic knowledge at different scales. Instead of learning from the teachers equally, the proposed method automatically assigns proper weights to each teacher via an attention mechanism, which enables the student to select teachers according to nodes' local structures. Extensive experiments are conducted on four public datasets to evaluate the proposed method. The experimental results demonstrate the superiority of our method over state-of-the-art methods. Our code is publicly available at https://github.com/NKU-IIPLab/MSKD.
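The attention mechanism described above can be illustrated with a minimal sketch: each teacher receives a weight from the softmax of a similarity score between the student's node embedding and that teacher's embedding, and the distillation target is the weighted mixture of the teachers' temperature-softened class distributions. The dot-product scoring and the function/parameter names here are illustrative assumptions, not the paper's exact parameterization.

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of floats."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def multi_teacher_target(student_emb, teacher_embs, teacher_logits, temperature=2.0):
    """Combine several teachers' soft labels for one node.

    Hypothetical sketch of attention-weighted multi-teacher distillation:
    - score each teacher by the dot product of the student's node
      embedding with that teacher's node embedding (an assumed choice
      of scoring function);
    - softmax the scores into per-teacher attention weights;
    - mix the teachers' temperature-softened class distributions.
    """
    scores = [sum(s * t for s, t in zip(student_emb, emb)) for emb in teacher_embs]
    weights = softmax(scores)
    # Temperature-softened class distribution from each teacher's logits.
    soft = [softmax([z / temperature for z in logits]) for logits in teacher_logits]
    n_cls = len(teacher_logits[0])
    # Attention-weighted mixture is the distillation target for the student.
    return [sum(w * p[c] for w, p in zip(weights, soft)) for c in range(n_cls)]
```

The student would then be trained to match this mixture (e.g. with a KL-divergence loss) in addition to the usual supervised loss; a teacher whose embedding aligns better with the student's view of a node contributes more to that node's target.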


Similar articles

Rejection of the Feed-Flow Disturbances in a Multi-Component Distillation Column Using a Multiple Neural Network Model-Predictive Controller

This article deals with the issues associated with developing a new design methodology for the nonlinear model-predictive control (MPC) of a chemical plant. A combination of multiple neural networks is selected and used to model a nonlinear multi-input multi-output (MIMO) process with time delays.  An optimization procedure for a neural MPC algorithm based on this model is then developed. T...


Geometric Matrix Completion with Recurrent Multi-Graph Neural Networks

Matrix completion models are among the most common formulations of recommender systems. Recent works have shown a boost in the performance of these techniques when the pairwise relationships between users/items are introduced in the form of graphs. Such techniques do not fully exploit the local stationary structures of user/item graphs, and the number of parameters to learn is linear w.r.t. the numbe...


Artificial Neural Networks for Identification and Control of a Lab-Scale Distillation Column Using LABVIEW

LABVIEW is a graphical programming language that has its roots in automation control and data acquisition. In this paper we have utilized this platform to provide a powerful toolset for process identification and control of nonlinear systems based on artificial neural networks (ANN). This tool has been applied to the monitoring and control of a lab-scale distillation column DELTALAB DC-SP. The ...


Large Scale Distributed Neural Network Training through Online Distillation

Techniques such as ensembling and distillation promise model quality improvements when paired with almost any base model. However, due to increased test-time cost (for ensembles) and increased complexity of the training pipeline (for distillation), these techniques are challenging to use in industrial settings. In this paper we explore a variant of distillation which is relatively straightforwar...



Journal

Journal title: Proceedings of the ... AAAI Conference on Artificial Intelligence

Year: 2022

ISSN: 2159-5399, 2374-3468

DOI: https://doi.org/10.1609/aaai.v36i4.20354